On the Convergence to a Global Solution of Shuffling-Type Gradient Algorithms

Nguyen, Lam M.

Neural Information Processing Systems

The stochastic gradient descent (SGD) algorithm is the method of choice for many machine learning tasks thanks to its scalability and efficiency on large-scale problems. In this paper, we focus on the shuffling version of SGD, which matches mainstream practical heuristics. We show convergence to a global solution of shuffling SGD for a class of non-convex functions in over-parameterized settings.
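The shuffling scheme the abstract refers to can be sketched in a few lines: each epoch draws a fresh random permutation of the data and makes one full pass, visiting every sample exactly once. The toy least-squares problem, step size, and epoch count below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal sketch of shuffling-type SGD (random reshuffling) on a toy
# least-squares problem. The data is consistent (y = X @ w_true), mimicking
# an interpolation/over-parameterized setting where per-sample gradients
# vanish at the solution.
rng = np.random.default_rng(0)
n, d = 64, 4
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true

w = np.zeros(d)
lr = 0.05  # illustrative constant step size
for epoch in range(50):
    perm = rng.permutation(n)  # reshuffle once per epoch
    for i in perm:             # one pass: each sample used exactly once
        grad = (X[i] @ w - y[i]) * X[i]  # per-sample least-squares gradient
        w -= lr * grad

loss = np.mean((X @ w - y) ** 2)
```

This contrasts with plain SGD, which samples indices with replacement; the without-replacement pass is the "practical heuristic" most deep learning frameworks implement.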



Unsupervised Learning for Solving the Travelling Salesman Problem

Neural Information Processing Systems

We propose UTSP, an Unsupervised Learning (UL) framework for solving the Travelling Salesman Problem (TSP). We train a Graph Neural Network (GNN) using a surrogate loss. The GNN outputs a heat map representing the probability of each edge being part of the optimal path. We then apply local search to generate our final prediction based on the heat map. Our loss function consists of two parts: one pushes the model to find the shortest path, and the other serves as a surrogate for the constraint that the route should form a Hamiltonian Cycle. Experimental results show that UTSP outperforms existing data-driven TSP heuristics. Our approach is both parameter efficient and data efficient: the model uses 10% of the parameters and 0.2% of the training samples compared with Reinforcement Learning or Supervised Learning methods.
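The heat-map-then-local-search pipeline described above can be sketched as follows. Here the "heat map" is a stand-in derived from negative edge lengths rather than a trained GNN, and the local search is a basic 2-opt pass; both are illustrative assumptions, not the paper's actual model or search procedure.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
pts = rng.random((8, 2))  # small random Euclidean instance
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
heat = -dist  # placeholder heat map: higher value = more promising edge

def greedy_tour(heat):
    """Build an initial tour by repeatedly following the hottest unused edge."""
    n = len(heat)
    tour, seen = [0], {0}
    while len(tour) < n:
        last = tour[-1]
        nxt = max((j for j in range(n) if j not in seen),
                  key=lambda j: heat[last, j])
        tour.append(nxt)
        seen.add(nxt)
    return tour

def tour_len(tour, dist):
    return sum(dist[tour[i], tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def two_opt(tour, dist):
    """Local search: reverse segments while any reversal shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(1, len(tour)), 2):
            cand = tour[:i] + tour[i:j][::-1] + tour[j:]
            if tour_len(cand, dist) < tour_len(tour, dist) - 1e-12:
                tour, improved = cand, True
    return tour

tour = two_opt(greedy_tour(heat), dist)
```

In UTSP the heat map comes from the trained GNN, so the local search starts from a much better-informed set of candidate edges than this distance-based proxy.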




Astronomers Are Closing In on the Kuiper Belt's Secrets

WIRED

As next-generation telescopes map this outer frontier, astronomers are bracing for discoveries that could reveal hidden planets, strange structures, and clues to the solar system's chaotic youth. Out beyond the orbit of Neptune lies an expansive ring of ancient relics, dynamical enigmas, and possibly a hidden planet--or two. The Kuiper Belt, a region of frozen debris about 30 to 50 times farther from the sun than the Earth is--and perhaps farther, though nobody knows--has been shrouded in mystery since it first came into view in the 1990s. Over the past 30 years, astronomers have cataloged about 4,000 Kuiper Belt objects (KBOs), including a smattering of dwarf worlds, icy comets, and leftover planet parts. But that number is expected to increase tenfold in the coming years as observations from more advanced telescopes pour in.



Provably Reliable Classifier Guidance through Cross-entropy Error Control

Sahu, Sharan, Banerjee, Arisina, Wu, Yuchen

arXiv.org Machine Learning

Classifier-guided diffusion models generate conditional samples by augmenting the reverse-time score with the gradient of a learned classifier, yet it remains unclear whether standard classifier training procedures yield effective diffusion guidance. We address this gap by showing that, under mild smoothness assumptions on the classifiers, controlling the cross-entropy error at each diffusion step also controls the error of the resulting guidance vectors: classifiers achieving conditional KL divergence $\varepsilon^2$ from the ground-truth conditional label probabilities induce guidance vectors with mean squared error $\widetilde{O}(d \varepsilon )$. Our result yields an upper bound on the sampling error under classifier guidance and bears resemblance to a reverse log-Sobolev-type inequality. Moreover, we show that the classifier smoothness assumption is essential, by constructing simple counterexamples demonstrating that, without it, control of the guidance vector can fail for almost all distributions. To our knowledge, our work establishes the first quantitative link between classifier training and guidance alignment, yielding both a theoretical foundation for classifier guidance and principled guidelines for classifier selection.
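The core operation the abstract describes, augmenting a score with the gradient of a classifier's log-probability, can be checked on a 1-D Gaussian toy model where everything is analytic. The two-class mixture, the means, and the closed-form classifier below are illustrative assumptions chosen so that the guided score has a known exact form; this is a sanity-check sketch, not the paper's construction.

```python
import numpy as np

# Toy model: p(x | y) = N(mu_y, sigma^2) with equal class priors, so the
# unconditional density is a symmetric two-component Gaussian mixture.
mu0, mu1, sigma = -2.0, 2.0, 1.0

def uncond_score(x):
    # Score of the mixture p(x) ∝ exp(-(x-mu0)^2/2σ²) + exp(-(x-mu1)^2/2σ²)
    w0 = np.exp(-(x - mu0) ** 2 / (2 * sigma**2))
    w1 = np.exp(-(x - mu1) ** 2 / (2 * sigma**2))
    return (w0 * (mu0 - x) + w1 * (mu1 - x)) / (sigma**2 * (w0 + w1))

def classifier_grad(x):
    # Gradient of log p(y=1 | x), which is logistic in x; the offset term
    # vanishes here because the means are symmetric (mu0 = -mu1).
    logit = (mu1 - mu0) * x / sigma**2
    p1 = 1.0 / (1.0 + np.exp(-logit))
    return (mu1 - mu0) / sigma**2 * (1.0 - p1)

def guided_score(x, scale=1.0):
    # Classifier guidance: add the scaled classifier gradient to the score.
    return uncond_score(x) + scale * classifier_grad(x)
```

With `scale=1`, Bayes' rule gives `guided_score(x) == (mu1 - x) / sigma**2`, the exact conditional score for class 1; an imperfect classifier would perturb the second term, which is precisely the guidance-vector error the paper bounds via cross-entropy control.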